Publisher: Sierra
We regard World in Conflict as one of the best real-time strategy games we've ever played. It's based on Microsoft's DirectX 10 API and, in collaboration with Nvidia's The Way It's Meant To Be Played developer support team, it incorporates some DirectX 10-specific graphics effects.
The first of these is a soft particle effect that removes the banding often found in particle effects such as smoke, explosions, fire and debris. Previously, these effects didn't really exist in the 3D world; they were merely an overlay. With DirectX 10, the edges of the particle effects are much softer and banding is almost non-existent, because the effects now interact with their 3D surroundings as a genuine part of the 3D world.
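The general idea behind soft particles is simple: the shader compares the scene's depth buffer against the particle's own depth and fades the particle out where the two get close, so the flat particle quad no longer cuts a hard line through geometry. As a rough illustration only (this is the generic technique, not Massive's actual shader code, and all names and parameters here are made up):

```python
def soft_particle_alpha(scene_depth, particle_depth, fade_distance, base_alpha):
    """Fade a particle's alpha as it approaches scene geometry.

    scene_depth:    depth of the opaque scene at this pixel
    particle_depth: depth of the particle fragment being shaded
    fade_distance:  distance over which the fade ramps from 0 to 1
    base_alpha:     the particle's unmodified alpha
    """
    # How far behind the particle the scene geometry sits
    diff = scene_depth - particle_depth
    # Equivalent of HLSL saturate(): clamp the ratio to [0, 1]
    fade = max(0.0, min(1.0, diff / fade_distance))
    # Particles touching (or behind) geometry fade to fully transparent
    return base_alpha * fade

# A particle far in front of geometry keeps its full alpha,
# one intersecting geometry fades toward zero.
print(soft_particle_alpha(10.0, 5.0, 1.0, 0.8))  # full alpha
print(soft_particle_alpha(5.2, 5.0, 1.0, 0.8))   # mostly faded
```

The hard "banding" line in the DirectX 9 version corresponds to the case where this fade is absent: alpha stays constant right up to the intersection with geometry.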
Additionally, the DirectX 10 version of the game features global cloud shadowing and volumetric lighting effects. The latter is often referred to as 'god rays', and its implementation in World in Conflict interacts with the surroundings incredibly well. The former is where clouds cast shadows on the rest of the environment and, because all clouds in World in Conflict are volumetric and dynamic, the shadows they cast are rendered dynamically in DirectX 10, adjusting to the size, shape and orientation of each cloud relative to the light source.
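The game's own 'god ray' implementation isn't public, but a common way to achieve the effect is a screen-space post-process: for each pixel, march a handful of samples along the line toward the light source, accumulating brightness with a decay factor so occluded pixels receive darkened shafts. A minimal 1D sketch of that sampling loop, with all names and parameter values purely illustrative:

```python
def god_ray(brightness, light_x, pixel_x, num_samples=8, decay=0.8):
    """Accumulate light along the ray from a pixel toward the light source.

    brightness: per-pixel-column brightness of the scene (1D for simplicity)
    light_x:    screen position of the light source
    pixel_x:    the pixel being shaded
    decay:      each successive sample (nearer the light) is attenuated,
                which gives the shafts their soft falloff
    """
    step = (light_x - pixel_x) / num_samples
    x = float(pixel_x)
    weight = 1.0
    total = 0.0
    for _ in range(num_samples):
        x += step
        total += brightness[int(round(x))] * weight
        weight *= decay
    return total / num_samples

# An unobstructed path to the light yields a bright shaft;
# a fully occluded path yields none.
print(god_ray([1.0] * 10, light_x=9, pixel_x=0, decay=1.0))
print(god_ray([0.0] * 10, light_x=9, pixel_x=0))
```

In a real renderer this runs per pixel in 2D against an occlusion mask of the sun, but the accumulation-with-decay loop is the heart of the effect.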
For our testing purposes, we used a full retail copy of the game patched to version 1.007, which includes a few fixes and improved performance under DirectX 10. We used a manual run-through of the Invasion level, which incorporates all of the effects discussed above. We chose not to use the built-in benchmark because it's largely CPU-limited. We used the 'very high' preset, and controlled anti-aliasing and anisotropic filtering via the advanced settings tab.
[Four benchmark graphs: World in Conflict frames per second for the Nvidia GeForce 9800 GX2 1GB, Nvidia GeForce 9800 GTX 512MB, Nvidia GeForce 8800 GTX 768MB, Nvidia GeForce 8800 GTS 512MB and AMD ATI Radeon HD 3870 X2 1GB]
At lower resolutions and with anti-aliasing disabled, the GeForce 9800 GTX looks pretty good against the GeForce 9800 GX2 in World in Conflict, but that's more than likely down to the fact that the game can be CPU-limited at times. In fact, it's not really until you get to 2560x1600 that a decent performance gap opens up between the two cards. At that resolution, the GeForce 9800 GTX starts to run out of steam, but you can just about enjoy WiC with 0xAA enabled at 'very high' quality settings.
Without AA enabled, the GeForce 9800 GTX also looks good against the GeForce 8800 GTX, where at 1920x1200 it manages to eke out a seven percent frame rate advantage. That difference quickly disappears when you start increasing the resolution and/or the number of anti-aliasing samples though, as the two cards are fairly similar in performance at 1680x1050 2xAA, 1920x1200 2xAA and 2560x1600 0xAA.